Algorithmic Justice League
Privacy meets Artificial Intelligence: The Intersection of Cybersecurity and Leadership - AI Trajectory 2023+
Much of humanity's future privacy depends on the promise, potential, and caveats of how we handle the power of AI in the coming years. AI is a powerhouse tool with immense potential to help humanity. As we activate and enable this tool in its many forms, let's make sure that we protect our rights to privacy, permission, bias removal, and more. So… is privacy still possible? How does "outside access" to your data, including criminal access, affect your world?
These high school students are fighting for ethical AI
It's been a busy year for Encode Justice, an international group of grassroots activists pushing for ethical uses of artificial intelligence. There have been legislators to lobby, online seminars to hold, and meetings to attend, all in hopes of educating others about the harms of facial-recognition technology. It would be a lot for any activist group to fit into the workday; most of the team behind Encode Justice have had to cram it all in around high school. That's because the group was created and is run almost entirely by high schoolers. Its founder and president, Sneha Revanur, is a 16-year-old high-school senior in San Jose, California, and at least one member of the leadership team isn't old enough to get a driver's license.
Olivia P. Walker on LinkedIn: Olay takes on computer algorithms to fight biased beauty standards
Joy Buolamwini is not only intelligent, she is effective and absolutely stunning. From the article: "Olay is launching a new campaign to help end discriminatory computer algorithms that skew standards of beauty, per an announcement emailed to Marketing Dive. The effort coincides with National Coding Week. The Procter & Gamble-owned brand is also teaming with [computer scientist and] activist Joy Buolamwini, founder of the Algorithmic Justice League, to conduct an audit of its own practices."
Bias in facial recognition isn't hard to discover, but it's hard to get rid of
Joy Buolamwini is a researcher at the MIT Media Lab who pioneered research into bias that's built into artificial intelligence and facial recognition. And the way she came to this work is almost a little too on the nose. As a graduate student at MIT, she created a mirror that would project aspirational images onto her face, like a lion or tennis star Serena Williams. But the facial-recognition software she installed wouldn't work on her Black face, until she literally put on a white mask. Buolamwini is featured in a documentary called "Coded Bias," airing tonight on PBS.
Can Auditing Eliminate Bias from Algorithms? – The Markup
For more than a decade, journalists and researchers have been writing about the dangers of relying on algorithms to make weighty decisions: who gets locked up, who gets a job, who gets a loan--even who has priority for COVID-19 vaccines. Rather than removing bias, one algorithm after another has codified and perpetuated it, while companies have largely shielded their algorithms from public scrutiny. The big question ever since: How do we solve this problem? Lawmakers and researchers have advocated for algorithmic audits, which would dissect and stress-test algorithms to see how they work and whether they're meeting their stated goals or producing biased outcomes. And there is a growing field of private auditing firms that purport to do just that.
Algorithmic Justice League - Unmasking AI harms and biases
In today's world, AI systems are used to decide who gets hired, the quality of medical treatment we receive, and whether we become a suspect in a police investigation. While these tools show great promise, they can also harm vulnerable and marginalized people, and threaten civil rights. Unchecked, unregulated and, at times, unwanted, AI systems can amplify racism, sexism, ableism, and other forms of discrimination. The Algorithmic Justice League's mission is to raise awareness about the impacts of AI, equip advocates with empirical research, build the voice and choice of the most impacted communities, and galvanize researchers, policy makers, and industry practitioners to mitigate AI harms and biases. We're building a movement to shift the AI ecosystem towards equitable and accountable AI.
What a Black tech movement might look like
Dr. Fallon Wilson is, like civil rights activist Fannie Lou Hamer, sick and tired of being sick and tired. Hamer and Wilson were both talking about a lack of progress on civil rights, but Wilson is talking specifically about data, AI, and tech from companies that have for years failed to make meaningful progress on diversity and inclusion initiatives. In a speech at the Kapor Center in Oakland, California, she said people cannot rely on companies like Facebook or Google to bring about meaningful change. "The truth is that the business of diversity and inclusion in tech companies will never eradicate structural racism, and I think we have to be clear about that," she said. "They cannot be the weathervane, nor should they, of what equitable progress looks like for Black people in this country as it relates to tech." Wilson was not referencing recent events like boycotts over Facebook's willingness to profit from hate or renewed diversity promises from Google and Microsoft.
Artificial Intelligence Can Be Biased. Here's What You Should Know.
Artificial intelligence has already started to shape our lives in ubiquitous and occasionally invisible ways. In its new documentary, In The Age of AI, FRONTLINE examines the promise and peril of this technology. AI systems are being deployed by hiring managers, courts, law enforcement, and hospitals -- sometimes without the knowledge of the people being screened. And while these systems were initially lauded for being more objective than humans, it's fast becoming clear that the algorithms harbor bias, too. It's an issue Joy Buolamwini, a graduate researcher at the Massachusetts Institute of Technology, knows about firsthand. She founded the Algorithmic Justice League to draw attention to the issue, and earlier this year she testified at a congressional hearing on the impact of facial recognition technology on civil rights. "One of the major issues with algorithmic bias is you may not know it's happening," Buolamwini told FRONTLINE.
Fighting the "coded gaze"
When I was a master's student at MIT, I worked on a number of different art projects that used facial analysis technology. One in particular--called The Aspire Mirror--would detect my face in a mirror and then display a reflection of something different, based on what inspired me or what I wanted to empathize with. As I was working on it, I realized that the software I was using had a hard time detecting my face. But after I made one adjustment, the software no longer struggled: I put on a white mask. This disheartening moment brought to mind Frantz Fanon's book Black Skin, White Masks, which interrogates the complexities of changing oneself--putting on a mask to fit the norms or expectations of a dominant culture.